Optimality, Computation, and Interpretation of Nonnegative Matrix Factorizations
Authors
Abstract
The notion of low-rank approximation arises in many important applications. When the low-rank data are further required to comprise only nonnegative values, the approach by nonnegative matrix factorization is particularly appealing. This paper intends to make three points. First, the theoretical Kuhn-Tucker optimality condition is described in explicit form. Second, a number of numerical techniques, old and new, are suggested for nonnegative matrix factorization problems. Third, the techniques are applied to two real-world applications to demonstrate the difficulty of interpreting the factorizations.
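For reference, for the standard Frobenius-norm formulation

    \min_{W \ge 0,\, H \ge 0} \; f(W,H) = \tfrac{1}{2}\,\|A - WH\|_F^2,

with A an m-by-n nonnegative data matrix, W m-by-r, and H r-by-n (this notation is assumed here and need not match the paper's), the first-order Kuhn-Tucker conditions at a local minimizer take the well-known form

    W \ge 0, \qquad H \ge 0,
    \nabla_W f = (WH - A)H^{\top} \ge 0, \qquad \nabla_H f = W^{\top}(WH - A) \ge 0,
    W \circ \big[(WH - A)H^{\top}\big] = 0, \qquad H \circ \big[W^{\top}(WH - A)\big] = 0,

where the inequalities hold entrywise and \circ denotes the Hadamard (entrywise) product. The explicit condition described in the paper is presumably an equivalent statement of this system.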
Similar resources
A Projected Alternating Least square Approach for Computation of Nonnegative Matrix Factorization
Nonnegative matrix factorization (NMF) is a common method in data mining that has been used in different applications as a dimension-reduction, classification, or clustering method. Methods based on the alternating least squares (ALS) approach are usually used to solve this non-convex minimization problem. At each step of an ALS algorithm, two convex least-squares problems must be solved, which causes high com...
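As a rough, self-contained illustration of the ALS idea summarized above (a generic projected-ALS sketch, not the specific method proposed in that paper), one can alternate the two least-squares solves and clip negative entries to zero; the function name projected_als, the rank r, and the iteration count below are illustrative choices, not taken from the paper.

    import numpy as np

    def projected_als(A, r, n_iter=200, seed=0):
        """Approximate a nonnegative m x n matrix A by W @ H with W, H >= 0."""
        rng = np.random.default_rng(seed)
        m, n = A.shape
        W = rng.random((m, r))
        for _ in range(n_iter):
            # Convex subproblem 1: least-squares solve for H given W, then project.
            H = np.maximum(np.linalg.lstsq(W, A, rcond=None)[0], 0.0)
            # Convex subproblem 2: least-squares solve for W given H, then project.
            W = np.maximum(np.linalg.lstsq(H.T, A.T, rcond=None)[0].T, 0.0)
        return W, H

    A = np.random.default_rng(1).random((50, 40))   # synthetic nonnegative data
    W, H = projected_als(A, r=5)
    print(np.linalg.norm(A - W @ H) / np.linalg.norm(A))  # relative residual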
A Modified Digital Image Watermarking Scheme Based on Nonnegative Matrix Factorization
This paper presents a modified digital image watermarking method based on nonnegative matrix factorization. First, the host image is factorized into the product of three nonnegative matrices. Then, the central matrix is transferred to the discrete cosine transform domain. The watermark is embedded in the low-frequency band of this matrix, and then the inverse transform is computed. Finally, the watermarked ...
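As a loose sketch of the pipeline outlined above (illustrative only, not the paper's actual scheme): the three-matrix nonnegative factorization is imitated here by applying scikit-learn's two-factor NMF twice, and the embedding strength alpha, the 4 x 4 low-frequency band, and all matrix sizes are invented for the example.

    import numpy as np
    from scipy.fft import dctn, idctn
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    host = rng.random((64, 64))               # stand-in for a nonnegative host image
    watermark = rng.integers(0, 2, (4, 4))    # small binary watermark

    # Step 1: write the host as a product of three nonnegative matrices,
    # host ~= W1 @ W2 @ H, via two successive two-factor NMFs (a stand-in
    # for whatever three-factor scheme the paper uses).
    outer = NMF(n_components=16, init="nndsvda", max_iter=500, random_state=0)
    W = outer.fit_transform(host)             # 64 x 16
    H = outer.components_                     # 16 x 64
    inner = NMF(n_components=8, init="nndsvda", max_iter=500, random_state=0)
    W1 = inner.fit_transform(W)               # 64 x 8
    W2 = inner.components_                    # 8 x 16, the "central" matrix

    # Step 2: move the central matrix to the DCT domain and add the watermark
    # to its low-frequency (top-left) coefficients.
    alpha = 0.05
    C = dctn(W2, norm="ortho")
    C[:4, :4] += alpha * watermark

    # Step 3: invert the transform and reassemble the watermarked image.
    W2_marked = idctn(C, norm="ortho")
    watermarked = W1 @ W2_marked @ H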
Sparse and unique nonnegative matrix factorization through data preprocessing
Nonnegative matrix factorization (NMF) has become a very popular technique in machine learning because it automatically extracts meaningful features through a sparse and part-based representation. However, NMF has the drawback of being highly ill-posed; that is, there typically exist many different but equivalent factorizations. In this paper, we introduce a completely new way of obtaining more...
Alternating Optimization Method Based on Nonnegative Matrix Factorizations for Deep Neural Networks
The backpropagation algorithm for calculating gradients has been widely used in the computation of weights for deep neural networks (DNNs). This method requires derivatives of objective functions and has some difficulties finding appropriate parameters such as the learning rate. In this paper, we propose a novel approach for computing the weight matrices of fully connected DNNs by using two types of semi-n...
Riordan group approaches in matrix factorizations
In this paper, we consider an arbitrary binary polynomial sequence {A_n} and give a lower triangular matrix representation of this sequence. As the main result, we obtain a factorization of the infinite generalized Pascal matrix in terms of this new matrix, using a Riordan group approach. Further, some interesting results and applications are derived.
Journal title:
Volume, Issue:
Pages: -
Publication date: 2004